- Path: news.cs.indiana.edu!djacobso@cs.indiana.edu
- From: "dan j" <djacobso@cs.indiana.edu>
- Newsgroups: comp.lang.c
- Subject: microsecond delay
- Date: Thu, 28 Mar 1996 15:56:46 -0500 (EST)
- Organization: Computer Science, Indiana University
- Message-ID: <14546@828046609>
- NNTP-Posting-Host: news.cs.indiana.edu
-
- We are writing a program that reads bar codes swiped on a card reader.
-
- The problem is that when we write to the device on the serial line,
- the following read is too fast for the device. We have to delay the
- read briefly. We can use "sleep" to delay a second, but that's too
- long.
-
- Is there any way to effect delays in milliseconds without resorting to
- busy loops?
-
- Many thanks,
-
- dan jacobson
-
-